Web Survey Bibliography
Title Do not touch data from late-received online questionnaires: Analysis of late responder in a closed-pool online survey
Author Stieger, S., Voracek, M.
Year 2005
Access date 22.04.2005
Abstract In postal surveys, paper-and-pencil questionnaires are fielded only for a certain time frame: data collection normally lasts for some days, weeks, or months. However, it is not clear when to stop data collection, and there are no guidelines. This is also relevant from a methodological point of view. If the expected effect size is small, a large number of participants has to be surveyed in order to obtain statistically significant results. To reach a large number of participants, either many potential participants have to be recruited or the study has to remain online for a certain time. The longer a study is online, the more likely statistical significance will be reached; however, over time the possibility of a negative influence from late responders increases. In our study, we therefore investigated whether late responders differ from early responders. Participants in this online study were recruited via email (addresses were retrieved from a publicly accessible directory). After the invitation to participate was sent out via email, the return rate decreased very quickly but never dropped to zero. Even after one year, online questionnaires were still being received from participants (late responders). Based on 4554 online questionnaires, we show that, on average, after the 45th day of the online study (1) the number of participants who masked their sex (gender switching) increased strongly, (2) the relative distribution of participants' faculty affiliations changed notably, (3) participants more often tried to bypass required fields, (4) the proportion of participants clicking through the online survey stayed constant, but the proportion of questions clicked through increased, (5) questionnaire completion time decreased while fewer questions were answered, and (6) the page-specific dropout rate increased.
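The analysis the abstract describes amounts to splitting responses at a fieldwork-day cutoff (day 45 in the study) and comparing data-quality indicators between the early and late groups. A minimal sketch of that comparison, using hypothetical response records (the field names `day_received` and `completion_minutes`, and the toy values, are assumptions, not the study's data):

```python
import statistics

# Hypothetical response log: day of fieldwork the questionnaire arrived,
# and how long the respondent took to complete it.
responses = [
    {"day_received": 2,   "completion_minutes": 14.0},
    {"day_received": 10,  "completion_minutes": 12.5},
    {"day_received": 44,  "completion_minutes": 13.0},
    {"day_received": 60,  "completion_minutes": 7.5},
    {"day_received": 120, "completion_minutes": 6.0},
]

# Cutoff reported in the abstract: responses after day 45 are "late".
CUTOFF_DAY = 45

early = [r["completion_minutes"] for r in responses if r["day_received"] <= CUTOFF_DAY]
late = [r["completion_minutes"] for r in responses if r["day_received"] > CUTOFF_DAY]

# Compare median completion time between the two groups (indicator 5
# in the abstract); the same split works for any other quality metric.
print(statistics.median(early))  # 13.0
print(statistics.median(late))   # 6.75
```

In the toy data the late group's shorter completion time mirrors the pattern the authors report; a real analysis would apply the same split to each of the six indicators.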
Abstract - optional (translated from German) Typically, paper-and-pencil questionnaires in postal surveys are administered within a limited time frame, i.e., the data-collection phase lasts a few days, weeks, or months. In general, however, it is not clear when such a survey should be stopped, and there are no guideline values for this. The question is also methodologically relevant: to obtain statistically significant results even for small expected effect sizes, a large number of participants must be surveyed. To reach a large number of participants, either many people must be recruited or the study must run correspondingly long. The longer a study runs, the sooner statistical significance is reached, but the more likely a negative influence on data quality by late responders becomes. In this online study we therefore investigated whether participants who respond late to an invitation differ from those who participated at the beginning of the study. Participants were recruited via email (the addresses came from a public directory). After the mailing, the return rate dropped very quickly but never fell completely to zero. Even more than a year after the start of the study, questionnaires were still arriving from people who had received an invitation (late responders). An analysis of 4554 questionnaires shows that, on average, after the 45th day of the online study (1) the number of participants who masked their sex (gender switching) rose strongly, (2) the relative faculty distribution of participants changed strongly, (3) participants increasingly tried to bypass required questions, (4) the proportion of click-through respondents remained constant, but the number of clicked-through questions rose strongly, (5) questionnaire completion time decreased while fewer questions were completed, and (6) page-specific dropout increased strongly.
Access/Direct link Homepage - conference (abstract)
Year of publication 2005
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography (4086)
- Displaying Videos in Web Surveys: Implications for Complete Viewing and Survey Responses; 2017; Mendelson, J.; Lee Gibson, J.; Romano Bergstrom, J. C.
- Using experts’ consensus (the Delphi method) to evaluate weighting techniques in web surveys not...; 2017; Toepoel, V.; Emerson, H.
- Mind the Mode: Differences in Paper vs. Web-Based Survey Modes Among Women With Cancer; 2017; Hagan, T. L.; Belcher, S. M.; Donovan, H. S.
- Answering Without Reading: IMCs and Strong Satisficing in Online Surveys; 2017; Anduiza, E.; Galais, C.
- Ideal and maximum length for a web survey; 2017; Revilla, M.; Ochoa, C.
- Social desirability bias in self-reported well-being measures: evidence from an online survey; 2017; Caputo, A.
- Web-Based Survey Methodology; 2017; Wright, K. B.
- Handbook of Research Methods in Health Social Sciences; 2017; Liamputtong, P.
- Lessons from recruitment to an internet based survey for Degenerative Cervical Myelopathy: merits of...; 2017; Davies, B.; Kotter, M. R.
- Web Survey Gamification - Increasing Data Quality in Web Surveys by Using Game Design Elements; 2017; Schacht, S.; Keusch, F.; Bergmann, N.; Morana, S.
- Effects of sampling procedure on data quality in a web survey; 2017; Rimac, I.; Ogresta, J.
- Comparability of web and telephone surveys for the measurement of subjective well-being; 2017; Sarracino, F.; Riillo, C. F. A.; Mikucka, M.
- Achieving Strong Privacy in Online Survey; 2017; Zhou, Yo.; Zhou, Yi.; Chen, S.; Wu, S. S.
- A Meta-Analysis of the Effects of Incentives on Response Rate in Online Survey Studies; 2017; Mohammad Asire, A.
- Telephone versus Online Survey Modes for Election Studies: Comparing Canadian Public Opinion and Vote...; 2017; Breton, C.; Cutler, F.; Lachance, S.; Mierke-Zatwarnicki, A.
- Examining Factors Impacting Online Survey Response Rates in Educational Research: Perceptions of Graduate...; 2017; Saleh, A.; Bista, K.
- Usability Testing for Survey Research; 2017; Geisen, E.; Romano Bergstrom, J. C.
- Paradata as an aide to questionnaire design: Improving quality and reducing burden; 2017; Timm, E.; Stewart, J.; Sidney, I.
- Fieldwork monitoring and managing with time-related paradata; 2017; Vandenplas, C.
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations; 2017; Jablonski, W.; Krzewinska, A.; Grzeszkiewicz-Radulska, K.
- Millennials and emojis in Spain and Mexico.; 2017; Bosch Jover, O.; Revilla, M.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Comparing the same Questionnaire between five Online Panels: A Study of the Effect of Recruitment Strategy...; 2017; Schnell, R.; Panreck, L.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Measuring Subjective Health and Life Satisfaction with U.S. Hispanics; 2017; Lee, S.; Davis, R.
- Humanizing Cues in Internet Surveys: Investigating Respondent Cognitive Processes; 2017; Jablonski, W.; Grzeszkiewicz-Radulska, K.; Krzewinska, A.
- A Comparison of Emerging Pretesting Methods for Evaluating “Modern” Surveys; 2017; Geisen, E.; Murphy, J.
- The Effect of Respondent Commitment on Response Quality in Two Online Surveys; 2017; Cibelli Hibben, K.
- Pushing to web in the ISSP; 2017; Jonsdottir, G. A.; Dofradottir, A. G.; Einarsson, H. B.
- The 2016 Canadian Census: An Innovative Wave Collection Methodology to Maximize Self-Response and Internet...; 2017; Mathieu, P.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- In search of best practices; 2017; Kappelhof, J. W. S.; Steijn, S.
- Redirected Inbound Call Sampling (RICS): A New Methodology; 2017; Krotki, K.; Bobashev, G.; Levine, B.; Richards, S.
- An Empirical Process for Using Non-probability Survey for Inference; 2017; Tortora, R.; Iachan, R.
- The perils of non-probability sampling; 2017; Bethlehem, J.
- A Comparison of Two Nonprobability Samples with Probability Samples; 2017; Zack, E. S.; Kennedy, J. M.
- Rates, Delays, and Completeness of General Practitioners’ Responses to a Postal Versus Web-Based...; 2017; Sebo, P.; Maisonneuve, H.; Cerutti, B.; Pascal Fournier, J.; Haller, D. M.
- Necessary but Insufficient: Why Measurement Invariance Tests Need Online Probing as a Complementary...; 2017; Meitinger, K.
- Nonresponse in Organizational Surveying: Attitudinal Distribution Form and Conditional Response Probabilities...; 2017; Kulas, J. T.; Robinson, D. H.; Kellar, D. Z.; Smith, J. A.
- Theory and Practice in Nonprobability Surveys: Parallels between Causal Inference and Survey Inference...; 2017; Mercer, A. W.; Kreuter, F.; Keeter, S.; Stuart, E. A.
- Is There a Future for Surveys; 2017; Miller, P. V.
- Reducing speeding in web surveys by providing immediate feedback; 2017; Conrad, F.; Tourangeau, R.; Couper, M. P.; Zhang, C.
- Social Desirability and Undesirability Effects on Survey Response latencies; 2017; Andersen, H.; Mayerl, J.
- A Working Example of How to Use Artificial Intelligence To Automate and Transform Surveys Into Customer...; 2017; Neve, S.
- A Case Study on Evaluating the Relevance of Some Rules for Writing Requirements through an Online Survey...; 2017; Warnier, M.; Condamines, A.
- Estimating the Impact of Measurement Differences Introduced by Efforts to Reach a Balanced Response...; 2017; Kappelhof, J. W. S.; De Leeuw, E. D.
- Targeted letters: Effects on sample composition and item non-response; 2017; Bianchi, A.; Biffignandi, S.